
    Get Your Foes Fooled: Proximal Gradient Split Learning for Defense Against Model Inversion Attacks on IoMT Data

    The past decade has seen rapid adoption of Artificial Intelligence (AI), specifically deep learning networks, in the Internet of Medical Things (IoMT) ecosystem. However, it has recently been shown that deep learning networks can be exploited by adversarial attacks that make the IoMT vulnerable not only to data theft but also to the manipulation of medical diagnoses. Existing studies consider adding noise to the raw IoMT data or to model parameters, which not only degrades overall performance on medical inferences but is also ineffective against attacks such as the deep leakage from gradients (DLG) method. In this work, we propose the proximal gradient split learning (PGSL) method for defense against model inversion attacks. The proposed method intentionally attacks the IoMT data while it undergoes deep neural network training at the client side. We propose the use of the proximal gradient method to recover gradient maps and a decision-level fusion strategy to improve recognition performance. Extensive analysis shows that PGSL not only provides an effective defense mechanism against model inversion attacks but also helps improve recognition performance on publicly available datasets. We report 14.0%, 17.9%, and 36.9% gains in accuracy over reconstructed and adversarially attacked images.
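
    The abstract does not spell out the paper's objective, so as a hedged illustration only, the sketch below shows one standard proximal gradient update (ISTA with an L1 proximal term, i.e. soft-thresholding) applied to a toy signal-recovery problem; the regularizer, step size, and "gradient map" stand-in are all assumptions, not the paper's method.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the L1 norm: prox_{lam*||.||_1}(x)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def proximal_gradient_step(w, grad_f, step, lam):
    """One proximal gradient update: a gradient step on the smooth
    loss f, followed by the proximal operator of the regularizer."""
    return soft_threshold(w - step * grad_f(w), step * lam)

# Toy example: recover a sparse vector from noisy observations
# (a hypothetical stand-in for recovering gradient maps).
rng = np.random.default_rng(0)
true_w = np.zeros(50)
true_w[[3, 17, 42]] = [1.5, -2.0, 0.8]
y = true_w + 0.05 * rng.standard_normal(50)

grad_f = lambda w: w - y            # gradient of 0.5 * ||w - y||^2
w = np.zeros_like(y)
for _ in range(200):
    w = proximal_gradient_step(w, grad_f, step=0.5, lam=0.02)
```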

    Stream-Based Authentication Strategy Using IoT Sensor Data in Multi-homing Sub-aqueous Big Data Network

    Big data analytics has addressed many in-place and remote network issues in sub-aqueous distributed computing environments. Recently, a new phenomenon has been introduced in data analytics clusters that focuses on multi-homing network connectivity procedures among multiple off-ground nodes of large-scale, on-running wireless industrial applications. In this way, the clusters perform multi-layer, cross-connected task processing across various networks simultaneously and perform stream-based data block placement over multiple nodes in sequential order. This satisfies the procedural performance of the cluster; however, security remains an open issue because inter-network data block processing lacks authorization. In this paper, we propose a stream-based authentication mechanism that specifically addresses the security concerns of multi-homing sub-aqueous big data networks, and we present a key authorization infrastructure that performs proper handshaking among multiple off-ground DataNodes before an inter-network data block exchange. The simulation results show that our approach increases multi-homing network compatibility and reliability while processing data blocks in a sub-aqueous distributed computing environment.
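
    The abstract does not detail the key authorization protocol, so the sketch below is only a minimal illustration of the general idea: two DataNodes authenticate a block exchange with a pre-shared key and an HMAC challenge-response before accepting the block. The shared-key provisioning, function names, and block identifiers are all hypothetical.

```python
import hmac, hashlib, os

SHARED_KEY = os.urandom(32)  # pre-provisioned key (illustrative only)

def issue_challenge():
    """Requesting DataNode sends a fresh nonce as a challenge."""
    return os.urandom(16)

def sign_block(key, nonce, block_id, payload):
    """Responding node binds the nonce, block id, and payload into one tag."""
    mac = hmac.new(key, digestmod=hashlib.sha256)
    for part in (nonce, block_id.encode(), payload):
        mac.update(part)
    return mac.digest()

def verify_block(key, nonce, block_id, payload, tag):
    """Constant-time check before accepting an inter-network block."""
    expected = sign_block(key, nonce, block_id, payload)
    return hmac.compare_digest(expected, tag)

# Handshake for one block exchange between two off-ground DataNodes
nonce = issue_challenge()
payload = b"sensor-stream-segment-0001"
tag = sign_block(SHARED_KEY, nonce, "blk_42", payload)
assert verify_block(SHARED_KEY, nonce, "blk_42", payload, tag)
```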

    An Aggregate MapReduce Data Block Placement Strategy for Wireless IoT Edge Nodes in Smart Grid

    Big data analytics has simplified the processing of large datasets in distributed environments. Many state-of-the-art platforms, such as the smart grid, have adopted the big data processing structure and manage large volumes of data through the MapReduce paradigm at distribution ends. Thus, whenever a wireless IoT edge node bundles a sensor dataset into storage media, a MapReduce agent performs analytics and writes the output to the grid repository. This practice has efficiently reduced resource consumption in such a giant network and strengthens other components of the smart grid to perform data analytics through aggregate programming. However, it incurs operational latency when accessing large datasets from a central repository. Since the smart grid processes I/O operations of multi-homing networks, it accesses large datasets to process MapReduce jobs at wireless IoT edge nodes. As a result, aggregate MapReduce at the wireless IoT edge nodes produces network congestion and operational latency problems. To overcome this issue, we propose the Wireless IoT Edge-enabled Block Replica Strategy (WIEBRS), which stores in-place, partition-based, and multi-homing block replicas on the respective edge nodes. This reduces the latency of accessing datasets for aggregate MapReduce and increases job performance in the smart grid. The simulation results show that WIEBRS effectively decreases operational latency while increasing aggregate MapReduce job performance in the smart grid.
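
    WIEBRS's exact placement policy is not given in the abstract; the toy sketch below only illustrates the three replica classes it names, keeping an in-place replica on the originating edge node and preferring same-partition peers, then multi-homing neighbours, for the remaining copies. All node names, fields, and the preference order are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class EdgeNode:
    name: str
    partition: str                    # logical data partition it serves
    networks: frozenset               # networks this node is homed on
    blocks: set = field(default_factory=set)

def place_replicas(block_id, origin, nodes, replicas=3):
    """Toy placement: in-place replica at the origin, then prefer
    same-partition peers, then nodes sharing a network (multi-homing)."""
    origin.blocks.add(block_id)       # in-place replica
    chosen = [origin]
    peers = sorted(
        (n for n in nodes if n is not origin),
        key=lambda n: (n.partition != origin.partition,
                       not (n.networks & origin.networks)))
    for node in peers[:replicas - 1]:
        node.blocks.add(block_id)     # partition-based / multi-homing replica
        chosen.append(node)
    return [n.name for n in chosen]

nodes = [
    EdgeNode("edge-A", "p1", frozenset({"net1", "net2"})),
    EdgeNode("edge-B", "p1", frozenset({"net2"})),
    EdgeNode("edge-C", "p2", frozenset({"net1"})),
    EdgeNode("edge-D", "p3", frozenset({"net3"})),
]
print(place_replicas("blk_7", nodes[0], nodes))  # ['edge-A', 'edge-B', 'edge-C']
```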

    Comparative analysis of machine learning algorithms for prediction of smart grid stability

    The global demand for electricity has grown rapidly alongside population and economic growth. It has therefore become necessary to distribute electricity efficiently to households and industries in order to reduce power loss. Smart Grids (SG) have the potential to reduce such power losses during power distribution. Machine learning and artificial intelligence techniques have been successfully applied to SGs to achieve enhanced accuracy in customer demand prediction. There is a clear need to analyze and evaluate the various machine learning algorithms and thereby identify the most suitable one for SGs. In the present work, several state-of-the-art machine learning algorithms, namely Support Vector Machines (SVM), K-Nearest Neighbor (KNN), Logistic Regression, Naive Bayes, Neural Networks, and the Decision Tree classifier, have been deployed to predict the stability of the SG. The SG dataset used in the study is publicly available from the UC Irvine (UCI) Machine Learning Repository. The experimental results highlight the superiority of the Decision Tree classifier, which outperformed the other state-of-the-art algorithms, yielding 100% precision, 99.9% recall, 100% F1 score, and 99.96% accuracy.
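
    As a hedged illustration of such a comparison pipeline, the sketch below trains the six listed classifiers with scikit-learn and reports the same four metrics. The CSV file name and the 'stab'/'stabf' column names are assumptions about the local copy of the UCI grid-stability dataset, and the hyperparameters are library defaults rather than the paper's settings.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score)

# Assumed local copy of the UCI grid-stability CSV with a binary
# 'stabf' label ('stable'/'unstable'); adjust names to your copy.
df = pd.read_csv("Data_for_UCI_named.csv")
X = df.drop(columns=["stab", "stabf"])   # 'stab' would leak the label
y = (df["stabf"] == "stable").astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

models = {
    "SVM": SVC(),
    "KNN": KNeighborsClassifier(),
    "LogReg": LogisticRegression(max_iter=1000),
    "NaiveBayes": GaussianNB(),
    "NeuralNet": MLPClassifier(max_iter=500),
    "DecisionTree": DecisionTreeClassifier(),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name:12s} acc={accuracy_score(y_te, pred):.4f} "
          f"prec={precision_score(y_te, pred):.4f} "
          f"rec={recall_score(y_te, pred):.4f} "
          f"f1={f1_score(y_te, pred):.4f}")
```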